
    Effect of Pre-Processing on Binarization

    Get PDF
    The effects of different image pre-processing methods for document image binarization are explored. The methods are compared across five different binarization methods, on images with bleed-through and stains as well as on images with uniform background speckle. The choice of binarization method has a significant effect on binarization accuracy, but pre-processing also plays a significant role. Total Variation pre-processing shows the best performance among the variety of pre-processing methods considered.
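
    As a rough illustration of the pipeline described above, the sketch below applies Total Variation denoising before a simple binarization step. The use of scikit-image, Otsu's threshold as the binarization method, and the TV weight are assumptions for the example, not details taken from the paper.

```python
# Sketch: Total Variation pre-processing followed by a simple binarization step.
# Assumes scikit-image is available; the TV weight and the choice of Otsu's
# method are illustrative, not the exact configuration evaluated in the paper.
import numpy as np
from skimage import io, img_as_float
from skimage.restoration import denoise_tv_chambolle
from skimage.filters import threshold_otsu

def tv_then_binarize(path, tv_weight=0.1):
    gray = img_as_float(io.imread(path, as_gray=True))
    # TV denoising flattens stains and bleed-through while preserving edges.
    smoothed = denoise_tv_chambolle(gray, weight=tv_weight)
    # Any binarization method could follow; a global Otsu threshold is used here.
    thresh = threshold_otsu(smoothed)
    return (smoothed > thresh).astype(np.uint8) * 255

# Example: binary = tv_then_binarize("degraded_page.png")
```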

    Pre-Processing of Degraded Printed Documents by Non-Local Means and Total Variation

    Get PDF
    We compare in this study two image restoration approaches for the pre-processing of printed documents: the Non-Local Means filter and a Total Variation minimization approach. We apply these two approaches to printed document sets from various periods, and we evaluate their effectiveness through character recognition performance using an open-source OCR. Our results show that for each document set, one or both pre-processing methods improve character recognition accuracy over recognition without pre-processing. Higher accuracies are obtained with Non-Local Means when characters have a low level of degradation, since they can be restored from similar neighboring parts of non-degraded characters. The Total Variation approach is more effective when characters are highly degraded and can only be restored through modeling instead of using neighboring data.
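
    A minimal sketch of this kind of comparison is given below, assuming scikit-image for the two filters and Tesseract (via pytesseract) as the open-source OCR engine; all parameter values are placeholders rather than the settings used in the study.

```python
# Sketch: compare Non-Local Means and Total Variation pre-processing by their
# effect on OCR output. scikit-image, pytesseract and Pillow are assumed to be
# installed; filter parameters are illustrative only.
from PIL import Image
import pytesseract
from skimage import io, img_as_float, img_as_ubyte
from skimage.restoration import denoise_nl_means, denoise_tv_chambolle, estimate_sigma

def ocr_after_preprocessing(path):
    gray = img_as_float(io.imread(path, as_gray=True))
    sigma = estimate_sigma(gray)

    # Non-Local Means: restores lightly degraded characters from similar patches.
    nlm = denoise_nl_means(gray, h=0.8 * sigma, patch_size=7,
                           patch_distance=11, fast_mode=True)
    # Total Variation: models the page as piecewise smooth, better for heavy degradation.
    tv = denoise_tv_chambolle(gray, weight=0.1)

    # Run OCR on the raw and the two pre-processed images and return the texts.
    return {name: pytesseract.image_to_string(Image.fromarray(img_as_ubyte(img)))
            for name, img in (("none", gray), ("nlm", nlm), ("tv", tv))}

# Example: texts = ocr_after_preprocessing("printed_page.png")
```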

    Enhancement of Historical Printed Document Images by Combining Total Variation Regularization and Non-Local Means Filtering

    Get PDF
    This paper proposes a novel method for document enhancement which combines two recent, powerful noise-reduction steps. The first step is based on the total variation framework: it flattens background grey-levels and produces an intermediate image in which background noise is considerably reduced. This image is used as a mask to produce an image with a cleaner background while keeping character details. The second step is applied to the cleaner image and consists of a filter based on non-local means: character edges are smoothed by searching for similar patches in pixel neighborhoods. The document images to be enhanced are real historical printed documents from several periods which exhibit several defects in their background and on character edges. These defects result from scanning, paper aging and bleed-through. The proposed method enhances document images by combining the total variation and non-local means techniques in order to improve OCR recognition. The method is shown to be more effective than either technique used alone and than other enhancement methods.
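
    The sketch below loosely mimics the two-step combination described: a TV-regularized image is used to mask the background, and Non-Local Means then smooths character edges in the cleaned image. The masking rule and every parameter are guesses for illustration, not the authors' exact procedure.

```python
# Sketch of a two-step enhancement in the spirit of the abstract:
# (1) TV regularization yields a flattened-background image used as a mask,
# (2) Non-Local Means smooths character edges in the cleaned image.
# The thresholding rule and parameters are assumptions, not the authors' settings.
import numpy as np
from skimage import io, img_as_float
from skimage.restoration import denoise_tv_chambolle, denoise_nl_means
from skimage.filters import threshold_otsu

def enhance_document(path, tv_weight=0.15):
    gray = img_as_float(io.imread(path, as_gray=True))

    # Step 1: TV flattening; pixels classified as background in the TV image
    # are pushed toward white, while foreground pixels keep their original detail.
    flat = denoise_tv_chambolle(gray, weight=tv_weight)
    background = flat > threshold_otsu(flat)
    cleaned = np.where(background, 1.0, gray)

    # Step 2: Non-Local Means smooths the remaining character edges.
    return denoise_nl_means(cleaned, h=0.05, patch_size=7,
                            patch_distance=9, fast_mode=True)
```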

    Optimal Trajectories of a UAV Base Station Using Hamilton-Jacobi Equations

    Full text link
    We consider the problem of optimizing the trajectory of an Unmanned Aerial Vehicle (UAV). Given a traffic intensity map of users to be served, the UAV must travel from a given initial location to a final position within a given duration, serving the traffic on its way. The problem consists of finding the optimal trajectory that minimizes a cost depending on the velocity and on the amount of served traffic. We formulate the problem in the framework of Lagrangian mechanics. Using Hamilton-Jacobi equations, we derive closed-form formulas for the optimal trajectory when the traffic intensity is quadratic (single-phase). When the traffic intensity is bi-phase, i.e. made of two quadratics, we provide necessary conditions of optimality that allow us to propose a gradient-based algorithm and a new algorithm based on the linear control properties of the quadratic model. Both solutions are of very low complexity because they rely on fast-converging numerical schemes and closed-form formulas, and both return a trajectory satisfying the necessary conditions of optimality. Finally, we propose a data processing procedure based on a modified K-means algorithm to derive a bi-phase model and an optimal trajectory simulation from real traffic data.
    Comment: 30 pages, 10 figures, 2 tables. arXiv admin note: substantial text overlap with arXiv:1812.0875
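
    As a generic illustration of the kind of cost being traded off, the toy sketch below discretizes a trajectory with fixed endpoints and runs gradient descent on a velocity penalty minus the traffic served under a single quadratic intensity. It is not the closed-form Hamilton-Jacobi solution of the paper, and the intensity map, weights and step sizes are invented for the example.

```python
# Toy sketch of UAV trajectory optimization over a quadratic traffic-intensity map.
# This is a generic discretize-and-gradient-descent illustration, not the paper's
# closed-form Hamilton-Jacobi solution; intensity parameters, cost weights and
# step sizes are made up for the example.
import numpy as np

T, N = 1.0, 50                                          # mission duration, time steps
dt = T / N
x0, xT = np.array([0.0, 0.0]), np.array([10.0, 5.0])    # fixed start / end positions
center = np.array([6.0, 2.0])                           # peak of the traffic intensity

def traffic(x):
    # Quadratic (single-phase) intensity, clipped at zero away from the peak.
    return np.maximum(0.0, 4.0 - 0.5 * np.sum((x - center) ** 2, axis=-1))

def traffic_grad(x):
    # Gradient of the intensity; zero wherever the intensity is clipped.
    active = traffic(x) > 0.0
    return -(x - center) * active[..., None]

def cost(path):
    vel = np.diff(path, axis=0) / dt
    # Penalize high velocity, reward time spent over dense traffic.
    return 0.5 * np.sum(vel ** 2) * dt - np.sum(traffic(path[1:-1])) * dt

path = np.linspace(x0, xT, N + 1)                       # start from the straight line
for _ in range(1000):
    interior, prev, nxt = path[1:-1], path[:-2], path[2:]
    grad = (2 * interior - prev - nxt) / dt - dt * traffic_grad(interior)
    path[1:-1] = interior - 0.005 * grad                # endpoints remain fixed

print("final cost:", cost(path))
```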

    Leveraging Hamilton-Jacobi PDEs with time-dependent Hamiltonians for continual scientific machine learning

    Full text link
    We address two major challenges in scientific machine learning (SciML): interpretability and computational efficiency. We increase the interpretability of certain learning processes by establishing a new theoretical connection between optimization problems arising from SciML and a generalized Hopf formula, which represents the viscosity solution to a Hamilton-Jacobi partial differential equation (HJ PDE) with time-dependent Hamiltonian. Namely, we show that when we solve certain regularized learning problems with integral-type losses, we actually solve an optimal control problem and its associated HJ PDE with time-dependent Hamiltonian. This connection allows us to reinterpret incremental updates to learned models as the evolution of an associated HJ PDE and optimal control problem in time, where all of the previous information is intrinsically encoded in the solution to the HJ PDE. As a result, existing HJ PDE solvers and optimal control algorithms can be reused to design new, efficient training approaches for SciML that naturally coincide with the continual learning framework, while avoiding catastrophic forgetting. As a first exploration of this connection, we consider the special case of linear regression and leverage our connection to develop a new Riccati-based methodology for solving these learning problems that is amenable to continual learning applications. We also provide corresponding numerical examples that demonstrate the potential computational and memory advantages our Riccati-based approach can provide.
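
    In the special case the abstract mentions, linear regression with incrementally arriving data, a classical analogue of a Riccati-style recursion is recursive least squares, sketched below: the matrix P summarizes all past observations, so new data can be absorbed without revisiting old data or retraining from scratch. This generic textbook recursion is offered only as an analogue of the setting described, not as the authors' HJ-PDE-based methodology.

```python
# Sketch: incremental (continual) linear regression via recursive least squares.
# The Riccati-style update of the matrix P encodes all past data, so new batches
# can be absorbed without storing or revisiting old ones. Generic textbook
# recursion, offered as an analogue of the setting described, not the paper's method.
import numpy as np

class RecursiveLinearRegression:
    def __init__(self, dim, reg=1e-2):
        self.w = np.zeros(dim)                 # current weights
        self.P = np.eye(dim) / reg             # "covariance" (inverse Gram) matrix

    def update(self, x, y):
        """Absorb one observation (x, y) without storing previous data."""
        Px = self.P @ x
        gain = Px / (1.0 + x @ Px)             # Kalman-style gain
        self.w += gain * (y - x @ self.w)      # correct the prediction error
        self.P -= np.outer(gain, Px)           # Riccati-style update of P

    def predict(self, x):
        return x @ self.w

# Continual learning over two data streams arriving sequentially.
rng = np.random.default_rng(0)
w_true = np.array([1.5, -2.0, 0.5])
model = RecursiveLinearRegression(dim=3)
for _ in range(2):
    X = rng.normal(size=(200, 3))
    y = X @ w_true + 0.1 * rng.normal(size=200)
    for xi, yi in zip(X, y):
        model.update(xi, yi)
print("learned weights:", np.round(model.w, 3))
```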

    Joint filtering of SAR amplitude and interferometric phase with graph-cuts

    Get PDF
    Like other coherent imaging modalities, synthetic aperture radar (SAR) images suffer from speckle noise. The presence of this noise makes the automatic interpretation of images a challenging task, and noise reduction is often a prerequisite for the successful use of classical image processing algorithms. Many approaches have been proposed to filter images corrupted by speckle noise. Markov random field (MRF) modeling provides a suitable framework for expressing both data-fidelity constraints and the desired properties of the filtered image. In this context, total variation minimization has been widely used to limit oscillations in the regularized image while preserving edges. Speckle noise follows a heavy-tailed probability distribution, and the MRF formulation leads to a minimization problem involving non-convex data terms. Such a minimization can be performed with a combinatorial optimization approach by computing minimum cuts on graphs. Although this optimization is feasible in theory, it cannot be applied in practice to the large images encountered in remote sensing applications because of its memory consumption, and the computation time of approximate minimization algorithms (in particular α-expansion) is generally too high when the joint regularization of several images is considered. We show that a satisfactory solution can be obtained, in a few iterations, by exploring the search space with large steps, this exploration being carried out with minimum-cut techniques. The algorithm is applied to jointly regularize the amplitude and the interferometric phase of SAR images of urban areas.
    Results are illustrated for regularization weights respectively less than, equal to, and greater than β_opt (under-regularized, well-regularized and over-regularized). Section IV-B presents results of the joint regularization of high-resolution interferometric SAR images on two datasets: a 1200 × 1200 pixel region of interest over Toulouse, France (figure 5), and a 1024 × 682 pixel region of interest over Saint-Paul sur Mer, France (figure 7). In the regularized images, noise is efficiently reduced in both the amplitude and phase images. The sharp transitions in the phase image that correspond to man-made structures are well preserved. Joint regularization gives more precise contours than independent regularization, since contours are co-located between the phase and amplitude images. Small objects also tend to be better preserved by joint regularization, as illustrated in figure 6, which shows a portion of streets with several aligned streetlights visible as brighter dots (higher reflectivity as well as higher altitude).
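
    The full multi-label, joint amplitude/phase scheme with large exploration moves is beyond a short snippet, but its elementary building block, solving a binary MRF with a TV-like smoothness prior exactly by one minimum cut, can be sketched with the PyMaxflow library; the data terms and weights below are illustrative only.

```python
# Sketch: the elementary building block of graph-cut regularization, a binary MRF
# denoising solved exactly by one minimum cut (using the PyMaxflow library).
# This toy binary example only illustrates how a TV-like smoothness prior plus
# per-pixel data terms map onto a min-cut problem; it is not the multi-label
# joint amplitude/phase scheme described in the abstract.
import numpy as np
import maxflow

def binary_graphcut_denoise(noisy, beta=1.5):
    """noisy: 2-D array with values in [0, 1]; beta: smoothness weight."""
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(noisy.shape)
    # Pairwise terms: constant penalty beta for neighboring pixels with different
    # labels, i.e. a discrete total-variation-like prior on the label image.
    g.add_grid_edges(nodes, beta)
    # Unary (data) terms: per-pixel cost of assigning label 0 or label 1.
    g.add_grid_tedges(nodes, noisy, 1.0 - noisy)
    g.maxflow()
    return np.logical_not(g.get_grid_segments(nodes))   # boolean label image

# Example with a synthetic noisy square.
rng = np.random.default_rng(1)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0
noisy = np.clip(clean + 0.4 * rng.normal(size=clean.shape), 0.0, 1.0)
labels = binary_graphcut_denoise(noisy, beta=1.5)
print("recovered foreground pixels:", int(labels.sum()))
```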